Entropic Priors

Author

  • Roland Preuss
Abstract

The method of Maximum (relative) Entropy (ME) is used to translate the information contained in the known form of the likelihood into a prior distribution for Bayesian inference. The argument is guided by intuition gained from the successful use of ME methods in statistical mechanics. For experiments that cannot be repeated the resulting “entropic prior” is formally identical with the Einstein fluctuation formula. For repeatable experiments, however, the expected value of the entropy of the likelihood turns out to be relevant information that must be included in the analysis. As an example the entropic prior for a Gaussian likelihood is calculated.
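The abstract states the Gaussian result only in words. As a hedged sketch in generic ME notation (the reference measure m(x) and the hyperparameter alpha are conventions assumed here for illustration, not necessarily the paper's), the entropic prior exponentiates the relative entropy of the likelihood:

    S(\theta) = -\int dx \, p(x \mid \theta) \, \ln \frac{p(x \mid \theta)}{m(x)},
    \qquad \pi(\theta) \propto e^{\alpha S(\theta)}

For a Gaussian likelihood p(x|\mu,\sigma) = N(x; \mu, \sigma^2) with a uniform m(x), the differential entropy gives S = \frac{1}{2}\ln(2\pi e \sigma^2) up to a constant, so \pi(\mu,\sigma) \propto \sigma^{\alpha}: flat in the location \mu and a power law in the scale \sigma.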


Similar articles

Consistency of Sequence Classification with Entropic Priors

Entropic priors, recently revisited within the context of theoretical physics, were originally introduced for image processing and for general statistical inference. They seem to represent a very promising approach to “objective” prior determination when prior information is not available. Attention has mostly been limited to continuous parameter spaces, and our focus in this work ...

Full text

Data Fusion with Entropic Priors

In classification problems, lack of knowledge of the prior distribution may make the application of Bayes’ rule inadequate. Uniform or arbitrary priors often provide classification answers that, even in simple examples, end up contradicting our common sense about the problem. Entropic priors, via application of the maximum entropy principle, seem to provide a much better answer and can ...

Full text
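As a hedged illustration of the abstract above (the numbers are hypothetical stand-ins, not values from the paper), a minimal Python sketch of Bayes' rule classification in which the choice of prior flips the decision:

    import numpy as np

    # Likelihoods p(x | class k) for one observation x of two classes;
    # hypothetical values chosen only to illustrate the effect.
    lik = np.array([0.30, 0.25])

    def posterior(prior):
        """Bayes' rule: p(k | x) is proportional to p(x | k) * p(k)."""
        unnorm = lik * prior
        return unnorm / unnorm.sum()

    uniform = np.array([0.5, 0.5])
    # A real entropic prior would be derived from the maximum entropy
    # principle; this vector is a stand-in for such an informative prior.
    entropic = np.array([0.3, 0.7])

    print(posterior(uniform))   # class 0 wins under the flat prior
    print(posterior(entropic))  # class 1 wins under the informative prior

With a flat prior the larger likelihood decides; an informative prior can reverse the answer, which is the inadequacy the abstract points to.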

Entropic Priors for Discrete Probabilistic Networks and for Mixtures of Gaussians Models

The ongoing, unprecedented exponential growth of available computing power has radically transformed the methods of statistical inference. What used to be a small minority of statisticians advocating the use of priors and strict adherence to Bayes’ theorem is now becoming the norm across disciplines. The evolutionary direction is clear: the trend is towards more realistic, flexi...

Full text

Approximate Maximum A Posteriori Inference with Entropic Priors

In certain applications it is useful to fit multinomial distributions to observed data with a penalty term that encourages sparsity. For example, in probabilistic latent audio source decomposition one may wish to encode the assumption that only a few latent sources are active at any given time. The standard heuristic of applying an L1 penalty is not an option when fitting the parameters to a mu...

Full text
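The truncated sentence presumably alludes to the fact that the L1 norm of a probability vector is identically 1, so an L1 penalty is constant on the simplex. A minimal sketch of the alternative named in the title (a negative-entropy penalty on a multinomial MAP fit; the objective shape and the value of beta are assumptions of this illustration, not the paper's algorithm):

    import numpy as np
    from scipy.optimize import minimize

    counts = np.array([8.0, 1.0, 1.0])  # hypothetical observed counts
    beta = 2.0                          # strength of the entropy penalty

    def neg_log_posterior(z):
        # Softmax parametrization keeps theta on the probability simplex.
        theta = np.exp(z - z.max())
        theta /= theta.sum()
        loglik = np.sum(counts * np.log(theta))
        entropy = -np.sum(theta * np.log(theta))
        # Entropic prior p(theta) ~ exp(-beta * H(theta)) favors sparsity.
        return -(loglik - beta * entropy)

    res = minimize(neg_log_posterior, np.zeros_like(counts))
    theta = np.exp(res.x - res.x.max())
    theta /= theta.sum()
    print(theta)  # sparser than the ML estimate counts / counts.sum()

The entropy term pulls probability mass onto the dominant component, mimicking the sparsity that an L1 term cannot induce here.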

Probabilistic Factorization of Non-negative Data with Entropic Co-occurrence Constraints

In this paper we present a probabilistic algorithm which factorizes non-negative data. We additionally employ entropic priors to require that user-specified pairs of factors in this model have their cross-entropy maximized or minimized. These priors allow us to construct factorization algorithms that result in maximally statistically different factors, something that generic non-...

Full text
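For context, a minimal sketch of the plain non-negative factorization that such an algorithm builds on, using the standard Lee-Seung multiplicative updates for the KL objective; the entropic co-occurrence priors that distinguish the paper are not implemented here:

    import numpy as np

    rng = np.random.default_rng(0)
    V = rng.random((20, 30))   # non-negative data matrix
    r = 4                      # number of latent factors
    W = rng.random((20, r))
    H = rng.random((r, 30))

    for _ in range(200):
        # Multiplicative updates minimizing the KL divergence D(V || WH).
        H *= (W.T @ (V / (W @ H))) / W.sum(axis=0, keepdims=True).T
        W *= ((V / (W @ H)) @ H.T) / H.sum(axis=1, keepdims=True).T

    print(np.abs(V - W @ H).mean())  # reconstruction residual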

Bayesian Inference Featuring Entropic Priors

The subject of this work is the parametric inference problem, i.e. how to infer, from data, the parameters of the likelihood of a random process whose parametric form is known a priori. The assumption that Bayes’ theorem has to be used to add new data samples reduces the problem to the question of how to specify a prior before having seen any data. For this subproblem three theorems are s...

Full text
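A hedged sketch of the sequential-update assumption described above (a conjugate Gaussian example with known noise variance; all numbers are illustrative): each batch's posterior becomes the prior for the next batch, so only the initial, pre-data prior needs to be specified.

    import numpy as np

    rng = np.random.default_rng(1)
    true_mu, noise_sd = 2.0, 1.0
    # Initial prior on mu: Normal(mu0, var0); the only free choice.
    mu0, var0 = 0.0, 10.0

    for batch in range(3):
        x = rng.normal(true_mu, noise_sd, size=10)
        # Conjugate Gaussian update: precisions add, means combine
        # precision-weighted; the posterior becomes the next prior.
        prec = 1.0 / var0 + len(x) / noise_sd**2
        mu0 = (mu0 / var0 + x.sum() / noise_sd**2) / prec
        var0 = 1.0 / prec
        print(f"after batch {batch}: mu = {mu0:.3f}, sd = {var0**0.5:.3f}")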


Journal:

Volume   Issue

Pages  -

Publication date: 2003